Bregman iterative regularization using model functions for nonconvex nonsmooth optimization


Abstract

In this paper, we propose a new algorithm, called ModelBI, that blends the Bregman iterative regularization method with the model function technique for solving a class of nonconvex nonsmooth optimization problems. On one hand, the use of model functions, which are essentially first-order approximations of the objective function, goes beyond the traditional Lipschitz gradient continuity assumption. On the other hand, the Bregman iterative regularization can generate solutions fitting certain structures. Theoretically, we show the global convergence of the proposed algorithm with the help of the Kurdyka-Łojasiewicz property. Finally, we consider two kinds of phase retrieval problems and derive an explicit iteration scheme. Numerical results verify and illustrate the potential of our algorithm.
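To make the idea concrete, here is a minimal toy sketch of a model-function iteration in the spirit the abstract describes: at each step the smooth part of the objective is replaced by its first-order model, and a Euclidean Bregman distance penalizes movement, so each subproblem has a closed-form solution. This is an illustrative sparse least-squares example (all names, the problem `0.5||Ax-b||^2 + lam*||x||_1`, and the Euclidean kernel are our assumptions), not the paper's ModelBI algorithm or its phase retrieval scheme.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal map of tau * ||.||_1: the closed-form solution of the
    # per-iteration subproblem when the Euclidean distance is used as
    # the Bregman kernel.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def model_bregman_sketch(A, b, lam=0.05, step=None, iters=500):
    """Toy model-function iteration for min 0.5*||Ax-b||^2 + lam*||x||_1.

    Each iterate minimizes (first-order model of the smooth part)
    + lam*||x||_1 + (1/step) * Euclidean Bregman distance to x_k,
    which reduces to a soft-thresholded gradient step.
    """
    m, n = A.shape
    if step is None:
        # 1/L with L the Lipschitz constant of the smooth gradient,
        # L = ||A||_2^2 (squared spectral norm).
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)  # gradient of the smooth part at x_k
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

With a non-Euclidean Bregman kernel (as in Bregman iterative regularization proper), the subproblem changes but the template of "first-order model plus Bregman distance" stays the same.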


Similar articles

Fast Stochastic Methods for Nonsmooth Nonconvex Optimization

We analyze stochastic algorithms for optimizing nonconvex, nonsmooth finite-sum problems, where the nonconvex part is smooth and the nonsmooth part is convex. Surprisingly, unlike the smooth case, our knowledge of this fundamental problem is very limited. For example, it is not known whether the proximal stochastic gradient method with constant minibatch converges to a stationary point. To tack...


Image Restoration with Compound Regularization Using a Bregman Iterative Algorithm

Some imaging inverse problems may require the solution to simultaneously exhibit properties that are not enforceable by a single regularizer. One way to attain this goal is to use a linear combination of regularizers, thus encouraging the solution to simultaneously exhibit the characteristics enforced by each individual regularizer. In this paper, we address the optimization problem resulting ...


A Class of Nonconvex Nonsmooth Approximate Potential Functions for Nonconvex Nonsmooth Image Restoration

Nonconvex nonsmooth potential functions have superior restoration performance for the images with neat boundaries. However, several difficulties emerge from the numerical computation. Thus the graduated nonconvex (GNC) method is suggested to deal with these problems. To improve the performance of the GNC method further, a class of nonconvex nonsmooth approximate potential functions have been co...


Stochastic Cubic Regularization for Fast Nonconvex Optimization

This paper proposes a stochastic variant of a classic algorithm, the cubic-regularized Newton method [Nesterov and Polyak, 2006]. The proposed algorithm efficiently escapes saddle points and finds approximate local minima for general smooth, nonconvex functions in only Õ(ε^(−3.5)) stochastic gradient and stochastic Hessian-vector product evaluations. The latter can be computed as efficiently as sto...


Benson's algorithm for nonconvex multiobjective problems via nonsmooth Wolfe duality

In this paper, we propose an algorithm to obtain an approximation set of the (weakly) nondominated points of nonsmooth multiobjective optimization problems with equality and inequality constraints. We use an extension of the Wolfe duality to construct the separating hyperplane in Benson's outer algorithm for multiobjective programming problems with subdifferentiable functions. We also fo...



Journal

Journal title: Frontiers in Applied Mathematics and Statistics

Year: 2022

ISSN: 2297-4687

DOI: https://doi.org/10.3389/fams.2022.1031039